Multi-block Nonconvex Nonsmooth Proximal ADMM: Convergence and Rates Under Kurdyka–Łojasiewicz Property

Authors

Abstract

We study the convergence and convergence rates of a multi-block proximal alternating direction method of multipliers (PADMM) for solving linearly constrained separable nonconvex nonsmooth optimization problems. This algorithm is an important variant of the alternating direction method of multipliers (ADMM) which includes a proximal term in each subproblem to cancel out complicated terms, targeting applications where the subproblems are not easy to solve or do not admit a simple closed-form solution. We consider an over-relaxation step size \(\beta\) in the dual update and provide a detailed convergence proof for any \(\beta \in (0,2)\). We prove convergence of the sequence generated by PADMM by showing that it has finite length and is a Cauchy sequence. Under the powerful Kurdyka–Łojasiewicz (KŁ) property, we establish convergence of the function values and the iterates, and show how various values of the KŁ exponent \(\theta\) associated with the objective function give rise to three different convergence rates. More precisely, if the exponent \(\theta =0\), the sequence converges in a finite number of iterations. If \(\theta \in (0,1/2]\), then the sequential convergence rate is \(cQ^{k}\), where \(c>0\), \(Q\in (0,1)\), and \(k\in {\mathbb {N}}\) is the iteration number. If \(\theta \in (1/2,1]\), then the rate \({\mathcal {O}}(1/k^{r})\) with \(r=(1-\theta )/(2\theta -1)\) is achieved.
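To make the iteration concrete, the following minimal NumPy sketch shows a two-block instance of the scheme described above: each subproblem is handled by a linearized proximal step rather than an exact minimization, and the dual variable is updated with an over-relaxed step size \(\beta \in (0,2)\). The step-size choices, the prox operators, and the random test instance are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def padmm(prox_f, prox_g, A, B, b, rho=1.0, beta=1.5, iters=500):
    """Two-block linearized proximal ADMM sketch for
        min_{x,y} f(x) + g(y)   s.t.   A x + B y = b,
    with over-relaxed dual step size beta in (0, 2)."""
    m, nx = A.shape
    ny = B.shape[1]
    x, y, lam = np.zeros(nx), np.zeros(ny), np.zeros(m)
    # proximal step sizes tied to the spectral norms so the linearized
    # quadratic is majorized (a standard, illustrative choice)
    tx = 1.0 / (rho * np.linalg.norm(A, 2) ** 2)
    ty = 1.0 / (rho * np.linalg.norm(B, 2) ** 2)
    for _ in range(iters):
        # x-step: prox of f after a gradient step on the smooth
        # augmented-Lagrangian term, instead of an exact minimization
        r = A @ x + B @ y - b
        x = prox_f(x - tx * rho * A.T @ (r + lam / rho), tx)
        # y-step: same proximal treatment for the second block
        r = A @ x + B @ y - b
        y = prox_g(y - ty * rho * B.T @ (r + lam / rho), ty)
        # over-relaxed dual ascent with beta in (0, 2)
        lam = lam + beta * rho * (A @ x + B @ y - b)
    return x, y, lam

# Illustrative instance: f = ||.||_1 (soft-threshold prox), g = box indicator
l1_prox = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
box_prox = lambda v, t: np.clip(v, -1.0, 1.0)

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 8))
B = rng.standard_normal((5, 8))
b = rng.standard_normal(5)
x, y, lam = padmm(l1_prox, box_prox, A, B, b)
print("constraint residual:", np.linalg.norm(A @ x + B @ y - b))
```

With \(\beta = 1.5\) the dual update overshoots the plain gradient-ascent step, which is the over-relaxation the abstract refers to; any \(\beta \in (0,2)\) is covered by the analysis.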


Similar Articles

Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x0, …, xp, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x0, …, xp, y, followed by updating the dual variable. We separate the variable y from the xi's as it has a spe...
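The update order this abstract describes (sweep the primal blocks x0, …, xp in turn, then update the dual variable) can be sketched as follows. The linearized proximal subproblem solver, the function name gs_admm, and the test instance are assumptions made for illustration; the cited paper allows other subproblem solvers and its special treatment of the last block y is only mimicked here by ordering it last.

```python
import numpy as np

def gs_admm(prox_ops, A_list, c, rho=1.0, iters=400):
    """Gauss-Seidel multi-block ADMM sketch:
        min  sum_i f_i(x_i)   s.t.   sum_i A_i x_i = c.
    Blocks are swept in order with linearized proximal steps (an
    illustrative choice; exact subproblem solves are also possible),
    followed by a single dual ascent step."""
    xs = [np.zeros(A.shape[1]) for A in A_list]
    w = np.zeros(c.shape[0])
    taus = [1.0 / (rho * np.linalg.norm(A, 2) ** 2) for A in A_list]
    for _ in range(iters):
        for i, (prox, Ai, ti) in enumerate(zip(prox_ops, A_list, taus)):
            # residual rebuilt from the freshest iterates (Gauss-Seidel)
            r = sum(Aj @ xj for Aj, xj in zip(A_list, xs)) - c
            xs[i] = prox(xs[i] - ti * rho * Ai.T @ (r + w / rho), ti)
        w = w + rho * (sum(Aj @ xj for Aj, xj in zip(A_list, xs)) - c)
    return xs, w

# Tiny usage with l1 blocks; the last block plays the role of y above
shrink = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
rng = np.random.default_rng(1)
A_list = [rng.standard_normal((4, 6)) for _ in range(3)]
c = rng.standard_normal(4)
xs, w = gs_admm([shrink] * 3, A_list, c)
print("residual:", np.linalg.norm(sum(A @ x for A, x in zip(A_list, xs)) - c))
```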


Parallel Multi-Block ADMM with o(1 / k) Convergence

This paper introduces a parallel and distributed extension to the alternating direction method of multipliers (ADMM) for solving the convex problem: minimize f1(x1) + ⋯ + fN(xN) subject to A1x1 + ⋯ + ANxN = c, x1 ∈ X1, …, xN ∈ XN. The algorithm decomposes the original problem into N smaller subproblems and solves them in parallel at each iteration. This Jacobian-type algorithm is we...
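In contrast to a Gauss-Seidel sweep, a Jacobi-type iteration lets all N subproblems read the same previous iterate, so they decouple and can run in parallel. The sketch below illustrates this structure only; the damping factor gamma and the linearized proximal steps are assumptions made for illustration and do not reproduce the cited paper's exact correction scheme or its o(1/k) guarantee.

```python
import numpy as np

def jacobi_admm(prox_ops, A_list, c, rho=1.0, gamma=0.5, iters=600):
    """Jacobi-type (parallel) multi-block ADMM sketch:
        min  sum_i f_i(x_i)   s.t.   sum_i A_i x_i = c.
    Every block update reads the previous iterate, so the N subproblems
    are independent and could run in parallel. The damping factor gamma
    is an assumed stabilization, not the paper's exact correction step."""
    xs = [np.zeros(A.shape[1]) for A in A_list]
    w = np.zeros(c.shape[0])
    taus = [1.0 / (rho * np.linalg.norm(A, 2) ** 2) for A in A_list]
    for _ in range(iters):
        # one shared residual: all blocks see the same (old) iterate
        r = sum(Aj @ xj for Aj, xj in zip(A_list, xs)) - c
        new_xs = [prox(x - t * rho * A.T @ (r + w / rho), t)
                  for prox, x, A, t in zip(prox_ops, xs, A_list, taus)]
        # damped averaging keeps the parallel iteration stable
        xs = [x + gamma * (nx - x) for x, nx in zip(xs, new_xs)]
        w = w + gamma * rho * (sum(Aj @ xj for Aj, xj in zip(A_list, xs)) - c)
    return xs, w
```

It can be called exactly like the Gauss-Seidel sketch above; only the residual used inside the sweep changes.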


On the Convergence Rate of Multi-Block ADMM

The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems. Despite its success in practice, the convergence properties of the standard ADMM for minimizing the sum of N (N ≥ 3) convex functions with N block variables linked by linear constraints have remained unclear for a very long time. In this paper, we present convergence and...


On the Sublinear Convergence Rate of Multi-Block ADMM

The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems. Despite its success in practice, the convergence of the standard ADMM for minimizing the sum of N (N ≥ 3) convex functions whose variables are linked by linear constraints has remained unclear for a very long time. Recently, Chen et al. [4] provided a counterexample show...


Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems

In this paper, we study the proximal gradient algorithm with extrapolation for minimizing the sum of a Lipschitz differentiable function and a proper closed convex function. Under the error bound condition used in [19] for analyzing the convergence of the proximal gradient algorithm, we show that there exists a threshold such that if the extrapolation coefficients are chosen below this threshol...
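A short sketch of the scheme being analyzed: a proximal gradient step applied at an extrapolated point y^k = x^k + β_k (x^k − x^{k−1}). The constant coefficient beta = 0.3 and the LASSO-type instance below are illustrative assumptions standing in for the threshold-bounded coefficient sequence of the cited analysis.

```python
import numpy as np

def prox_grad_extrapolation(grad_f, prox_g, L, x0, beta=0.3, iters=500):
    """Proximal gradient with a fixed extrapolation coefficient beta for
        min  f(x) + g(x),  with f L-smooth and g proper closed convex.
    The cited analysis allows beta_k below a problem-dependent threshold;
    a constant beta is used here purely for illustration."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)                        # extrapolation
        x_prev, x = x, prox_g(y - grad_f(y) / L, 1.0 / L)  # prox-gradient step
    return x

# LASSO-type instance: f(x) = 0.5 ||Ax - b||^2, g = mu * ||x||_1
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
mu = 0.1
L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - mu * t, 0.0)
x = prox_grad_extrapolation(grad_f, prox_g, L, np.zeros(50))
print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + mu * np.abs(x).sum())
```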



Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2021

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-021-01919-7